A note on conjugate gradient convergence

Authors

Abstract


Similar articles

A unified convergence bound for conjugate gradient and accelerated gradient∗

Nesterov’s accelerated gradient method for minimizing a smooth strongly convex function f is known to reduce f(x_k) − f(x*) by a factor of ε ∈ (0, 1) after k ≥ O(√(L/ℓ) log(1/ε)) iterations, where ℓ, L are the two parameters of smooth strong convexity. Furthermore, it is known that this is the best possible complexity in the function-gradient oracle model of computation. The method of linear conju...
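The constant-momentum form of the accelerated method described above can be sketched as follows. This is a minimal illustration, not the construction from the cited paper; the function name, the starting point, and the choice of a diagonal quadratic test problem are all assumptions made for the example. With the classical momentum coefficient β = (√L − √ℓ)/(√L + √ℓ), the method contracts at the accelerated rate rather than the plain gradient-descent rate.

```python
import numpy as np

def nesterov_accelerated_gradient(grad, x0, L, mu, num_iter):
    """Minimize an L-smooth, mu-strongly convex f given its gradient.

    Constant-momentum variant: beta = (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)).
    """
    beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for _ in range(num_iter):
        x_next = y - grad(y) / L          # gradient step from the extrapolated point
        y = x_next + beta * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

# Example (assumed test problem): f(x) = 0.5 x^T A x with A = diag(1, 10),
# so grad f(x) = A x, L = 10, mu = 1, and the minimizer is the origin.
A = np.diag([1.0, 10.0])
x_min = nesterov_accelerated_gradient(lambda z: A @ z, [1.0, 1.0],
                                      L=10.0, mu=1.0, num_iter=200)
```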


A Note on Hard Cases For Conjugate Gradient Method

The Conjugate Gradient (CG) method is often used to solve a positive definite linear system Ax = b. This paper analyzes two hard cases for CG, or for any Krylov subspace type method, by either analytically finding the residual formulas or tightly bounding the residuals from above and below, in contrast to existing results which only bound residuals from above. The analysis is based on a general framewo...
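For reference, the standard CG iteration for a symmetric positive definite system Ax = b can be sketched as below. This is a textbook implementation, not code from the cited paper; the function name, tolerance, and the small 2×2 test system are assumptions made for the example.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve Ax = b for symmetric positive definite A by the CG method."""
    n = len(b)
    if max_iter is None:
        max_iter = n  # in exact arithmetic CG terminates in at most n steps
    x = np.zeros(n)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)        # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:        # converged: residual norm below tol
            break
        p = r + (rs_new / rs_old) * p    # new A-conjugate search direction
        rs_old = rs_new
    return x

# Example (assumed test system): a small SPD matrix.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

The hard cases studied in the paper concern how the residual norm behaves along these iterations for particular spectra of A, which is why bounding the residuals from below as well as above matters.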


Convergence Properties of Nonlinear Conjugate Gradient Methods

Recently, important contributions on convergence studies of conjugate gradient methods have been made by Gilbert and Nocedal [6]. They introduce a “sufficient descent condition” to establish global convergence results, whereas this condition is not needed in the convergence analyses of Newton and quasi-Newton methods. [6] hints that the sufficient descent condition, which was enforced by their ...


New results on the convergence of the conjugate gradient method

This paper is concerned with proving theoretical results related to the convergence of the Conjugate Gradient method for solving positive definite symmetric linear systems. New relations for ratios of the A-norm of the error and the norm of the residual are provided starting from some earlier results of Sadok [13]. These results use the well-known correspondence between the Conjugate Gradient m...


A Note on the Descent Property Theorem for the Hybrid Conjugate Gradient Algorithm CCOMB Proposed by Andrei

In [1] (Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization, J. Optim. Theory Appl. 141 (2009) 249–264), an efficient hybrid conjugate gradient algorithm, the CCOMB algorithm, is proposed for solving unconstrained optimization problems. However, the proof of Theorem 2.1 in [1] is incorrect due to an erroneous inequality which was used to indicate the descent property for the s...



Journal

Journal title: Numerische Mathematik

Year: 1997

ISSN: 0029-599X,0945-3245

DOI: 10.1007/s002110050260